Large margin classifiers based on affine hulls

Authors

  • Hakan Cevikalp
  • Bill Triggs
  • Hasan Serhan Yavuz
  • Yalçin Küçük
  • Mahide Küçük
  • Atalay Barkana
Abstract

This paper introduces a geometrically inspired large-margin classifier that can be a better alternative to Support Vector Machines (SVMs) for classification problems with a limited number of training samples. In contrast to the SVM classifier, we approximate classes with the affine hulls of their samples rather than their convex hulls. For any pair of classes approximated with affine hulls, we introduce two solutions for finding the best separating hyperplane between them. In the first proposed formulation, we compute the closest points on the affine hulls of the classes and connect these two points with a line segment. The optimal separating hyperplane between the two classes is chosen to be the hyperplane that is orthogonal to the line segment and bisects it. The second formulation is derived by modifying the ν-SVM formulation. Both formulations are extended to the nonlinear case by using the kernel trick. Based on our findings, we also develop a geometric interpretation of the Least Squares SVM classifier and show that it is a special case of the proposed method. Multi-class classification problems are dealt with by constructing and combining several binary classifiers, as in SVM. Experiments on several databases show that the proposed methods work as well as the SVM classifier, if not better.

Email addresses: [email protected] (Hakan Cevikalp), [email protected] (Bill Triggs), [email protected] (Hasan Serhan Yavuz), [email protected] (Yalcin Kucuk), [email protected] (Mahide Kucuk), [email protected] (Atalay Barkana)

Preprint submitted to Neurocomputing, April 19, 2010
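The first (closest-points) formulation can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; it assumes classes whose affine hulls do not intersect (otherwise the closest-point distance is zero and no separating hyperplane exists), represents each affine hull by its centroid plus an orthonormal basis of the centered samples, and finds the closest pair of points by ordinary least squares:

```python
import numpy as np

def affine_hull(X):
    """Return (centroid, orthonormal basis) of the affine hull of the
    rows of X (one sample per row)."""
    c = X.mean(axis=0)
    # Orthonormal basis of the span of the centered samples via SVD.
    U, s, _ = np.linalg.svd((X - c).T, full_matrices=False)
    r = int(np.sum(s > 1e-10))       # numerical rank
    return c, U[:, :r]

def separating_hyperplane(X1, X2):
    """Closest points on the two affine hulls, and the hyperplane that is
    orthogonal to the segment joining them and bisects it."""
    c1, U1 = affine_hull(X1)
    c2, U2 = affine_hull(X2)
    # Minimize ||(c1 + U1 v1) - (c2 + U2 v2)|| via least squares:
    # stack the unknowns as [v1; v2] and solve [U1, -U2] v ~= c2 - c1.
    A = np.hstack([U1, -U2])
    v, *_ = np.linalg.lstsq(A, c2 - c1, rcond=None)
    p1 = c1 + U1 @ v[:U1.shape[1]]   # closest point on hull of class 1
    p2 = c2 + U2 @ v[U1.shape[1]:]   # closest point on hull of class 2
    w = p1 - p2                      # normal to the separating hyperplane
    b = -w @ (p1 + p2) / 2           # hyperplane bisects the segment
    return w, b

# Toy example: two non-intersecting lines (1-D affine hulls) in R^3.
X1 = np.array([[0., 0., 0.], [1., 0., 0.]])   # line along the x-axis
X2 = np.array([[0., 0., 3.], [0., 1., 3.]])   # line along y at height z = 3
w, b = separating_hyperplane(X1, X2)
print(np.sign(X1 @ w + b))   # class 1 samples fall on the positive side
print(np.sign(X2 @ w + b))   # class 2 samples fall on the negative side
```

In this toy case the closest points are (0, 0, 0) and (0, 0, 3), so the resulting hyperplane is z = 1.5, midway between the two lines.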


Similar articles

Escaping the Convex Hull with Extrapolated Vector Machines

Maximum margin classifiers such as Support Vector Machines (SVMs) critically depend upon the convex hulls of the training samples of each class, as they implicitly search for the minimum distance between the convex hulls. We propose Extrapolated Vector Machines (XVMs) which rely on extrapolations outside these convex hulls. XVMs improve SVM generalization very significantly on the MNIST [7] O...

Full text

Towards A Deeper Geometric, Analytic and Algorithmic Understanding of Margins

Given a matrix A, a linear feasibility problem (of which linear classification is a special case) aims to find a solution to a primal problem w : ATw > 0 or a certificate for the dual problem which is a probability distribution p : Ap = 0. Inspired by the continued importance of “large-margin classifiers” in machine learning, this paper studies a condition measure of A called its margin that de...

Full text

Hard or Soft Classification? Large-margin Unified Machines.

Margin-based classifiers have been popular in both machine learning and statistics for classification problems. Among numerous classifiers, some are hard classifiers while some are soft ones. Soft classifiers explicitly estimate the class conditional probabilities and then perform classification based on estimated probabilities. In contrast, hard classifiers directly target on the classificatio...

Full text

AN OBSERVER-BASED INTELLIGENT DECENTRALIZED VARIABLE STRUCTURE CONTROLLER FOR NONLINEAR NON-CANONICAL NON-AFFINE LARGE SCALE SYSTEMS

In this paper, an observer-based fuzzy adaptive controller (FAC) is designed for a class of large scale systems with non-canonical non-affine nonlinear subsystems. It is assumed that the functions of the subsystems and the interactions among subsystems are unknown. By constructing a new class of state observer for each follower, the proposed consensus control method solves the problem of unmeasured sta...

Full text

A Note on Extending Generalization Bounds for Binary Large-Margin Classifiers to Multiple Classes

A generic way to extend generalization bounds for binary large-margin classifiers to large-margin multi-category classifiers is presented. The simple proceeding leads to surprisingly tight bounds showing the same Õ(d) scaling in the number d of classes as state-of-the-art results. The approach is exemplified by extending a textbook bound based on Rademacher complexity, which leads to a multi-cl...

Full text


Journal:
  • Neurocomputing

Volume 73, Issue 

Pages  -

Publication date: 2010